Speech-centric Multimodal Interaction for Easy-to-access Online Services - A Personal Life Assistant for the Elderly

Authors

  • António J. S. Teixeira
  • Annika Hämäläinen
  • Jairo Avelar
  • Nuno Almeida
  • Géza Németh
  • Tibor Fegyó
  • Csaba Zainkó
  • Tamás Gábor Csapó
  • Bálint Tóth
  • André Oliveira
  • José Miguel Salles Dias
Abstract

The PaeLife project is a European industry-academia collaboration whose goal is to provide the elderly with easy access to online services that make their lives easier and encourage their continued participation in society. To reach this goal, the project partners are developing a multimodal virtual personal life assistant (PLA) offering a wide range of services from weather information to social networking. This paper presents the multimodal architecture of the PLA, the services provided by the PLA, and the work done in the area of speech input and output modalities, which play a key role in the application. © 2013 The Authors. Published by Elsevier B.V. Selection and peer-review under responsibility of the Scientific Programme Committee of the 5th International Conference on Software Development and Technologies for Enhancing Accessibility and Fighting Info-exclusion (DSAI 2013).
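The abstract only names the PLA's multimodal architecture without describing it; the sketch below is a minimal, hypothetical illustration of the general idea behind a speech-centric assistant core, in which events coming from any input modality are routed to registered services. All names (Event, AssistantCore, the get_weather handler) are illustrative assumptions, not the PLA's actual interfaces.

    # Minimal, hypothetical sketch of a speech-centric multimodal dispatch loop.
    # All identifiers are illustrative assumptions; this is not the PLA's code.
    from dataclasses import dataclass
    from typing import Callable, Dict

    @dataclass
    class Event:
        modality: str   # e.g. "speech" or "touch"
        intent: str     # e.g. "get_weather"
        payload: dict   # slots extracted from the user's input

    class AssistantCore:
        """Routes events from any input modality to the matching service."""
        def __init__(self) -> None:
            self.services: Dict[str, Callable[[dict], str]] = {}

        def register_service(self, intent: str, handler: Callable[[dict], str]) -> None:
            self.services[intent] = handler

        def handle(self, event: Event) -> str:
            handler = self.services.get(event.intent)
            if handler is None:
                return "Sorry, I cannot help with that yet."
            return handler(event.payload)

    # Example: a weather service reachable by speech or touch alike.
    core = AssistantCore()
    core.register_service("get_weather",
                          lambda p: f"Weather for {p.get('city', 'your area')}: sunny.")
    print(core.handle(Event(modality="speech", intent="get_weather",
                            payload={"city": "Aveiro"})))

The point of the sketch is that services are registered once and remain agnostic of the input modality, which is how speech input and output can be treated as first-class modalities alongside touch.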

Related articles

A speech-centric perspective for human-computer interface

Speech technology has been playing a central role in enhancing human-machine interactions, especially for small devices for which the GUI has obvious limitations. The speech-centric perspective for human-computer interface advanced in this paper derives from the view that speech is the only natural and expressive modality to enable people to access information from and to interact with any de...

Mipad: a Multimodal Interaction Prototype

Dr. Who is a Microsoft research project aiming at creating a speech-centric multimodal interaction framework, which serves as the foundation for the .NET natural user interface. MiPad is the application prototype that demonstrates compelling user advantages for wireless Personal Digital Assistant (PDA) devices. MiPad fully integrates continuous speech recognition (CSR) and spoken language un...

Integrated User Interfaces for the Home Environment

In this paper, we describe the development of a handheld personal home assistant capable of controlling a wide range of electronic home devices. A demonstrator of the system was developed in the European project TIDE HEPHAISTOS (Home Environment Private Help AssISTant fOr elderly and diSabled). Its multimodal user interface is based on a coloured high-resolution touch screen extended with spee...

Ontology-Based Discourse Understanding for a Persistent Meeting Assistant

This paper describes current research efforts towards automatic understanding of multimodal discourse for a persistent personal office assistant. The assistant aids users in performing office-related tasks such as coordinating schedules with other users, providing relevant information for completing tasks, making a record of meetings, and assisting in fulfilling the decisions made ...

Journal:

Volume   Issue

Pages  -

Publication date: 2013